
    Random ambience using high fidelity images

    Most secure communication today requires true random keys as input. Key generation is usually designed and handled by the developers of the cryptosystem, and because of the confidential nature of cryptographic development, pseudorandom keys are still typically designed and preferred by those developers. However, pseudorandom keys are predictable, periodic and repeatable, and therefore carry minimal entropy. True random keys are generally believed to be obtainable only from hardware random number generators, and careful statistical analysis is still required to gain confidence that the process and apparatus generate numbers sufficiently random for cryptographic use. In this research, each moment in life is considered unique in itself: a random key unique to the given moment is generated by the user whenever he or she needs random keys for practical secure communication. The ambience captured in a high-fidelity digital image is tested for randomness against the NIST Statistical Test Suite, and a recommendation for generating simple random cryptographic keys live at 4 megabits per second is reported.
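
    As a hedged illustration of the idea (not the authors' exact pipeline), the Python sketch below harvests the least-significant bits of sensor noise from a high-fidelity photograph as candidate random bits and writes them in the plain ASCII 0/1 format accepted by the NIST Statistical Test Suite; the file names are placeholders.

    # Minimal sketch, assuming image sensor noise as the entropy source.
    import numpy as np
    from PIL import Image

    def image_to_bits(path: str) -> np.ndarray:
        """Return the least-significant bit of every RGB sample as a flat 0/1 array."""
        pixels = np.asarray(Image.open(path).convert("RGB"), dtype=np.uint8)
        return (pixels & 1).flatten()

    def write_nist_input(bits: np.ndarray, out_path: str) -> None:
        """Dump the bits as an ASCII '0'/'1' stream, the simplest STS input format."""
        with open(out_path, "w") as fh:
            fh.write("".join(map(str, bits.tolist())))

    if __name__ == "__main__":
        bits = image_to_bits("ambience.jpg")          # hypothetical file name
        write_nist_input(bits, "ambience_bits.txt")
        print(f"extracted {bits.size} candidate random bits")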

    Mapping Process of Digital Forensic Investigation Framework

    Digital forensics is essential for the successful prosecution of digital criminals, and it involves diverse digital devices such as computer systems, network devices, mobile devices and storage devices. A digital forensic investigation must retrieve evidence in a way that will be accepted in a court of law. Therefore, for a digital forensic investigation to be performed successfully, a number of important steps have to be taken into consideration. The aim of this paper is to produce a mapping between the processes/activities and the output of each phase in the Digital Forensic Investigation Framework (DFIF). Existing digital forensic frameworks are reviewed and the mapping is then constructed. The result of the mapping process provides a new framework that optimizes the whole investigation process.
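
    The sketch below shows one way such a phase-to-activities/outputs mapping could be represented; the phase names, activities and outputs are hypothetical placeholders, since the paper derives its mapping from the frameworks it reviews.

    # Illustrative only: hypothetical DFIF phases and their activity/output mapping.
    from dataclasses import dataclass, field

    @dataclass
    class Phase:
        name: str
        activities: list[str] = field(default_factory=list)
        outputs: list[str] = field(default_factory=list)

    dfif_mapping = [
        Phase("Preparation", ["obtain authorisation", "prepare tools"], ["approved plan"]),
        Phase("Collection", ["acquire devices", "image storage media"], ["forensic images"]),
        Phase("Analysis", ["examine artefacts", "reconstruct events"], ["documented findings"]),
        Phase("Presentation", ["write report", "present testimony"], ["court-admissible evidence"]),
    ]

    for phase in dfif_mapping:
        print(f"{phase.name}: {', '.join(phase.activities)} -> {', '.join(phase.outputs)}")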

    Intrusion Alert Correlation Technique Analysis for Heterogeneous Log

    Intrusion alert correlation is a multi-step process that receives alerts from heterogeneous log resources as input and produces a high-level description of the malicious activity on the network. The objective of this study is to analyse current alert correlation techniques and identify the significant criteria in each technique that can address Intrusion Detection System (IDS) problems such as alert flooding, contextual problems, false alerts and scalability. The existing alert correlation techniques were reviewed and analysed. From the analysis, six capability criteria were identified to improve current alert correlation techniques: alert reduction, alert clustering, multi-step attack identification, false alert reduction, known attack detection and unknown attack detection.
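
    As a hedged, toy illustration of two of these criteria (alert reduction and alert clustering), the sketch below merges alerts that share the same signature and source IP within a short time window; it is not one of the surveyed techniques, and the alert fields are assumptions.

    # Toy alert clustering: group alerts by (signature, source IP) within a time window.
    from collections import defaultdict
    from dataclasses import dataclass

    @dataclass
    class Alert:
        timestamp: float      # seconds since epoch
        signature: str        # e.g. an IDS rule name
        src_ip: str
        dst_ip: str

    def cluster_alerts(alerts: list[Alert], window: float = 60.0) -> list[list[Alert]]:
        """Merge alerts with identical (signature, src_ip) arriving within `window` seconds."""
        buckets: dict[tuple[str, str], list[list[Alert]]] = defaultdict(list)
        for alert in sorted(alerts, key=lambda a: a.timestamp):
            clusters = buckets[(alert.signature, alert.src_ip)]
            if clusters and alert.timestamp - clusters[-1][-1].timestamp <= window:
                clusters[-1].append(alert)        # extend the most recent cluster
            else:
                clusters.append([alert])          # start a new cluster
        return [cluster for groups in buckets.values() for cluster in groups]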

    Building of the Enabled Web-Based GIS Participation System: A Tool to Enhance Community Participation in City Development Plan

    This research has been conducted under the framework of the GIS interoperability infrastructure financed by the Malaysian Ministry of Science and Technology. In this paper, we discuss the development and analyse the potential application of Geographical Information System (GIS) and Internet computing to enhance community participation in decision-making processes, using the Local Plan of Block 3 in the Melaka Tengah district as a model. The focus of this paper is the design and development of the enabled Web-based GIS Participation System, called GISPSS. It is considered similar to a Real Time Resources Discovery Server, but it comprises four main complementary components: the Map Viewer, the Objection Support, the Planning Process Documentation, and the Web Content Management. The functional features of these components are highlighted. A multi-tiered architecture forms the basis of the enabled Web-based GIS system adopted in this research.
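
    The sketch below is a minimal stand-in, not the GISPSS implementation: it exposes the four components named in the abstract as routes of a thin web tier (using Flask), whereas the real system rests on a multi-tiered GIS architecture; the route names and component roles are paraphrased assumptions.

    # Minimal web tier naming the four GISPSS components; roles are illustrative only.
    from flask import Flask, jsonify

    app = Flask(__name__)

    @app.route("/map-viewer")
    def map_viewer():
        return jsonify(component="Map Viewer", role="browse local-plan map layers")

    @app.route("/objections")
    def objection_support():
        return jsonify(component="Objection Support", role="lodge and track public objections")

    @app.route("/planning-docs")
    def planning_process_documentation():
        return jsonify(component="Planning Process Documentation", role="serve plan documents")

    @app.route("/content")
    def web_content_management():
        return jsonify(component="Web Content Management", role="manage site content")

    if __name__ == "__main__":
        app.run(debug=True)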

    Analysis on Differential Router Buffer Size towards Network Congestion: A Simulation-based

    Network resources are shared amongst a large number of users. Improper management of network traffic leads to congestion, which degrades network performance; congestion occurs when the traffic exceeds the network capacity. In this research, we observe the buffer size values that contribute to network congestion. A simulation study using OPNET Modeler 14.5 is conducted for this purpose. A simple dumb-bell topology is used to observe several parameters such as the number of packets dropped, retransmission count, end-to-end TCP delay, queuing delay and link utilization. The results show that determining the buffer size from the Bandwidth-Delay Product (BDP) remains applicable for up to 500 users before the network starts to become congested. The symptoms of a near-congestion situation are also discussed with reference to the simulation results. Therefore, the buffer size needs to be determined to optimize network performance for our network topology. In future, an extended study will investigate the effect of other buffer sizing models such as the Stanford Model and the Tiny Buffer Model, and buffer sizes for wireless environments will also be determined.
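
    As a worked illustration of the sizing rules named above, the sketch below computes the classic Bandwidth-Delay Product buffer and the Stanford-model buffer (BDP divided by the square root of the number of long-lived TCP flows); the link rate, RTT and flow count are illustrative values, not taken from the simulation.

    # Buffer-sizing rules of thumb: BDP and the Stanford (BDP / sqrt(N)) model.

    def bdp_bits(link_rate_bps: float, rtt_s: float) -> float:
        """Bandwidth-Delay Product: buffer (bits) = link capacity x round-trip time."""
        return link_rate_bps * rtt_s

    def stanford_buffer_bits(link_rate_bps: float, rtt_s: float, n_flows: int) -> float:
        """Stanford model: BDP divided by sqrt(N) for N long-lived TCP flows."""
        return bdp_bits(link_rate_bps, rtt_s) / (n_flows ** 0.5)

    if __name__ == "__main__":
        rate, rtt, flows = 100e6, 0.1, 500        # e.g. 100 Mbps link, 100 ms RTT, 500 users
        print(f"BDP buffer      : {bdp_bits(rate, rtt) / 8e6:.3f} MB")
        print(f"Stanford buffer : {stanford_buffer_bits(rate, rtt, flows) / 8e6:.3f} MB")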

    Congestive Loss in Wireless Ad hoc Network: Network Performance Analysis

    Communication in wireless networks is quite susceptible to mobility, node capacity and power consumption levels. These factors contribute to the major problem of TCP performance degradation, where packet loss and packet reordering are highly likely. In this research, we observe the behaviour of packets once a node's capacity is limited while relaying on-going data; this condition occurs when the node's buffer starts to overload. A simulation study using OPNET Modeler 14.5 is conducted for this purpose. A static ad hoc topology with the number of users set to 2^n (n = 0, 1, 2, 3 and 4) is used to observe several parameters such as throughput, number of packets dropped, retransmission count and end-to-end TCP delay. The results show that the buffer size of an ad hoc node influences network performance whenever the number of users changes. In future, we plan to extend this study to gain a deeper understanding of the effect of mobility in wireless networks.
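
    The toy model below is not the OPNET scenario; it only illustrates the congestive-loss mechanism the abstract describes: a single relay node with a fixed buffer receives one packet per user per step, forwards a fixed number, and drops the overflow, so growing the user count from 2^0 to 2^4 shows how quickly a limited buffer congests. All rates and sizes are assumptions.

    # Toy relay-node buffer-overflow model for 2**n senders.
    from collections import deque

    def simulate_relay(n_users_exp: int, buffer_pkts: int = 50,
                       service_per_step: int = 8, steps: int = 1000) -> int:
        users = 2 ** n_users_exp
        queue: deque[int] = deque()
        dropped = 0
        for _ in range(steps):
            for _ in range(users):                     # each user offers one packet per step
                if len(queue) < buffer_pkts:
                    queue.append(1)
                else:
                    dropped += 1                        # buffer overflow -> congestive loss
            for _ in range(min(service_per_step, len(queue))):
                queue.popleft()                         # the node forwards what it can
        return dropped

    for n in range(5):                                  # n = 0..4 as in the abstract
        print(f"2^{n} users -> {simulate_relay(n)} packets dropped")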

    Alert Correlation Technique Analysis For Diverse Log

    Alert correlation is a process that analyses the alerts produced by one or more diverse devices and provides a more succinct, high-level view of occurring or attempted intrusions. The objective of this study is to analyse current alert correlation techniques and identify the significant criteria in each technique that can address Intrusion Detection System (IDS) problems such as alert flooding, contextual problems, false alerts and scalability. The existing alert correlation techniques were reviewed and analysed. From the analysis, six capability criteria were identified to improve current alert correlation techniques: alert reduction, alert clustering, multi-step attack identification, false alert reduction, known attack detection and unknown attack detection. A combination of techniques is also proposed.
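
    Because the alerts here come from a diverse log, one prerequisite for correlation is normalising records from different devices into a single schema. The sketch below shows that step only; the device record layouts and field names are invented for illustration.

    # Normalising heterogeneous device records into one alert schema (hypothetical layouts).
    from dataclasses import dataclass

    @dataclass
    class NormalisedAlert:
        timestamp: str
        device: str
        signature: str
        src_ip: str

    def from_ids(record: dict) -> NormalisedAlert:
        # hypothetical network-IDS record layout
        return NormalisedAlert(record["ts"], "ids", record["rule"], record["source"])

    def from_firewall(record: dict) -> NormalisedAlert:
        # hypothetical firewall log layout
        return NormalisedAlert(record["time"], "firewall", record["action"], record["src"])

    raw = [
        ({"ts": "00:00:01", "rule": "SQLi attempt", "source": "10.0.0.5"}, from_ids),
        ({"time": "00:00:02", "action": "DROP", "src": "10.0.0.5"}, from_firewall),
    ]
    alerts = [parse(record) for record, parse in raw]
    print(alerts)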

    Gender voice classification with huge accuracy rate

    Gender voice recognition is an important research field in acoustics and speech processing, as the human voice exhibits very distinctive characteristics. This study investigates speech signals to devise a gender classifier that predicts the gender of the speaker by analysing diverse parameters of the voice sample. A database of 2270 voice samples of celebrities, both male and female, is used. Using Mel-frequency cepstral coefficients (MFCC), vector quantization (VQ) and a machine learning algorithm (J48), the proposed classification technique, based on data mining and Java script, achieves an accuracy of about 100%.
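
    The sketch below mirrors the shape of such a pipeline rather than the paper's implementation: MFCC features are extracted with librosa and averaged per clip, and a decision tree (sklearn's DecisionTreeClassifier as a rough stand-in for Weka's J48/C4.5) is trained on them; the vector quantization step and the 2270-sample database are omitted, and the file paths and labels are placeholders.

    # MFCC -> mean feature vector -> decision-tree gender classifier (stand-in for J48).
    import librosa
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    def mfcc_features(path: str, n_mfcc: int = 13) -> np.ndarray:
        """Mean MFCC vector for one audio clip."""
        y, sr = librosa.load(path, sr=None)
        return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc).mean(axis=1)

    # placeholder (file path, label) pairs standing in for the voice database
    samples = [("male_001.wav", "male"), ("female_001.wav", "female")]
    X = np.vstack([mfcc_features(path) for path, _ in samples])
    y = [label for _, label in samples]

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)
    clf = DecisionTreeClassifier().fit(X_tr, y_tr)
    print("held-out accuracy:", clf.score(X_te, y_te))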

    Local Dependence for Bivariate Weibull Distributions Created by Archimedean Copula

    In multivariate survival analysis, estimating multivariate distribution functions and then measuring the association between survival times are of great interest. Copula functions, such as Archimedean copulas, are commonly used to estimate unknown bivariate distributions based on known marginal functions. In this paper, the feasibility of using the idea of local dependence to identify the most efficient copula model among several Archimedean copulas, which is then used to construct a bivariate Weibull distribution as the bivariate survival-time distribution, is explored. Furthermore, to evaluate the efficiency of the proposed procedure, a simulation study is implemented; the approach is shown to be useful for practical situations and applicable to real datasets. Moreover, when the proposed procedure is applied to Diabetic Retinopathy Study (DRS) data, it is found that treated eyes have a greater chance of non-blindness compared to untreated eyes.
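
    As a hedged illustration of the construction described above (with the Clayton family as one example of an Archimedean copula; the paper compares several families and chooses among them via local dependence), the bivariate Weibull survival function can be written as follows.

    % Weibull marginals coupled by an Archimedean copula (Clayton shown as an example).
    \documentclass{article}
    \usepackage{amsmath}
    \begin{document}
    Weibull marginal survival functions with shape $k_j$ and scale $\lambda_j$:
    \begin{equation}
      S_j(t_j) = \exp\!\left[-\left(\frac{t_j}{\lambda_j}\right)^{k_j}\right], \qquad j = 1, 2.
    \end{equation}
    An Archimedean copula with generator $\varphi$:
    \begin{equation}
      C_{\varphi}(u, v) = \varphi^{-1}\bigl(\varphi(u) + \varphi(v)\bigr),
    \end{equation}
    e.g.\ the Clayton family with $\varphi(t) = (t^{-\theta} - 1)/\theta$, which gives
    $C(u, v) = \bigl(u^{-\theta} + v^{-\theta} - 1\bigr)^{-1/\theta}$ for $\theta > 0$.
    The bivariate Weibull survival function is then obtained by coupling the marginals:
    \begin{equation}
      S(t_1, t_2) = C_{\varphi}\bigl(S_1(t_1), S_2(t_2)\bigr).
    \end{equation}
    \end{document}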